Learning Rates of Least-Square Regularized Regression

Authors

  • Qiang Wu
  • Yiming Ying
  • Ding-Xuan Zhou
Abstract

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. When the kernel is C∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^(−ζ) with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution.

Short Title: Least-square Regularized Regression
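For orientation, the algorithm analyzed here is what is commonly called kernel ridge regression: minimize the empirical least-square error plus λ times the squared RKHS norm, f_z = argmin_{f ∈ H_K} (1/m) Σ_i (f(x_i) − y_i)² + λ‖f‖²_K. By the representer theorem the minimizer is a kernel expansion over the sample, with coefficients solving a linear system. A minimal sketch follows; the Gaussian kernel, sample size, and toy data are illustrative assumptions, not the paper's choices:

    import numpy as np

    def gaussian_kernel(A, B, sigma=0.5):
        # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma^2)).
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def fit_krr(X, y, lam, sigma=0.5):
        # Least-square regularized regression in the RKHS H_K:
        #   minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
        # By the representer theorem f_z(x) = sum_i c_i K(x, x_i), where
        # the coefficients solve (K + m * lam * I) c = y.
        m = len(X)
        K = gaussian_kernel(X, X, sigma)
        c = np.linalg.solve(K + m * lam * np.eye(m), y)
        return lambda Xnew: gaussian_kernel(Xnew, X, sigma) @ c

    # Toy usage: recover a smooth regression function from noisy samples.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(50, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
    f_z = fit_krr(X, y, lam=1e-3)
    print(f_z(np.array([[0.5]])))  # close to sin(pi * 0.5) = 1

The factor m in the linear system comes from the 1/m normalization of the empirical error; the paper's rates concern how the choice of λ (as a function of m) trades the approximation error of H_K against the sample error.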


Similar articles

Regularized Least Square Regression with Spherical Polynomial Kernels

This article considers regularized least square regression on the sphere. It develops a theoretical analysis of the generalization performance of the regularized least square regression algorithm with spherical polynomial kernels. Explicit bounds are derived for the excess risk. The learning rates depend on the eigenvalues of spherical polynomial integral operators and on the dimension o...


Error analysis of regularized least-square regression with Fredholm kernel

Learning with the Fredholm kernel has attracted increasing attention recently, since it can effectively use the data information to improve prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in the learning theory literature. In this paper, we establish the generalization bound of least square regularized ...


Reproducing Kernel Hilbert Spaces in Learning Theory: the Sphere and the Hypercube

We analyze the regularized least square algorithm in learning theory with Reproducing Kernel Hilbert Spaces (RKHS). Explicit convergence rates for the regression and binary classification problems are obtained, in particular for the polynomial and Gaussian kernels on the n-dimensional sphere and the hypercube. There are two major ingredients in our approach: (i) a law of large numbers for Hilber...


Optimal Rates for Regularized Least Squares Regression

We establish a new oracle inequality for kernel-based, regularized least squares regression methods, which uses the eigenvalues of the associated integral operator as a complexity measure. We then use this oracle inequality to derive learning rates for these methods. Here, it turns out that these rates are independent of the exponent of the regularization term. Finally, we show that our learning...
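The eigenvalues of the integral operator L_K f = ∫ K(·, x) f(x) dρ(x) are not directly observable, but they are well approximated by the eigenvalues of the normalized kernel matrix K/m built from a sample. A small sketch of this empirical proxy, with the Gaussian kernel and synthetic data as illustrative assumptions:

    import numpy as np

    def empirical_eigenvalues(X, sigma=0.5):
        # Gram matrix of a Gaussian kernel on the sample.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-d2 / (2.0 * sigma ** 2))
        # Eigenvalues of K/m approximate those of the integral operator
        # L_K f = E[K(., x) f(x)]; their decay rate is the complexity
        # measure used in eigenvalue-based oracle inequalities.
        return np.sort(np.linalg.eigvalsh(K / len(X)))[::-1]

    rng = np.random.default_rng(0)
    spectrum = empirical_eigenvalues(rng.uniform(-1.0, 1.0, size=(200, 1)))
    print(spectrum[:5])  # rapid decay for a smooth (Gaussian) kernel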


Optimal learning rates for least squares SVMs using Gaussian kernels

We prove a new oracle inequality for support vector machines with Gaussian RBF kernels solving the regularized least squares regression problem. To this end, we apply the modulus of smoothness. With the help of the new oracle inequality, we then derive learning rates that can also be achieved by a simple data-dependent parameter selection method. Finally, it turns out that our learning rates are...
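The excerpt does not spell out the selection method; a common reading of "simple data-dependent parameter selection" is a hold-out grid search over the regularization parameter λ and the Gaussian kernel width σ. The sketch below is that generic scheme under our own assumptions (grid, split fraction, toy data), not necessarily the paper's exact rule:

    import numpy as np

    def krr_predict(Xtr, ytr, Xte, lam, sigma):
        # Gaussian-kernel regularized least squares; predictions on Xte.
        def gram(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))
        c = np.linalg.solve(gram(Xtr, Xtr) + len(Xtr) * lam * np.eye(len(Xtr)), ytr)
        return gram(Xte, Xtr) @ c

    def select_parameters(X, y, lambdas, sigmas, val_frac=0.3, seed=0):
        # Hold-out selection: fit on a training split, keep the
        # (lam, sigma) pair with the smallest validation square error.
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(X))
        n_val = int(val_frac * len(X))
        va, tr = idx[:n_val], idx[n_val:]
        scores = {
            (lam, sigma): np.mean(
                (krr_predict(X[tr], y[tr], X[va], lam, sigma) - y[va]) ** 2)
            for lam in lambdas for sigma in sigmas
        }
        return min(scores, key=scores.get)

    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(100, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)
    print(select_parameters(X, y, lambdas=[1e-1, 1e-2, 1e-3],
                            sigmas=[0.25, 0.5, 1.0]))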




Journal:
  • Foundations of Computational Mathematics

Volume: 6  Issue: —

Pages: —

Publication date: 2006